Conversation
ravitch
left a comment
So I have to ask - this could be accomplished with a shell script calling hey for each URL, right? Without maintaining a fork?
```go
record, err := r.Read()
nurl := record[0]
// ignore any strings that aren't formatted correctly
if strings.Contains(nurl, ".com") == false {
```
If you're trying to check that it's a valid URL, use url.Parse. But you could always just let it fail when it tries to connect.
```go
}
r := csv.NewReader(csvfile)
// TODO remove this after i'm done testing
break_early := true
for {
	record, err := r.Read()
```
Need to check err before accessing record, which could be empty.
And make sure you handle EOF so you exit gracefully.
```go
	log.Fatal(err)
}
generateRequestAndGetResponse(nurl, method, username, password, *hostHeader, header, dur, bodyAll, num, conc, rateLimit, proxyURL)
if break_early {
```
I wouldn't bother passing in hostHeader (or any of the other globally declared flags); the function can read them directly.
```go
break_early := true
for {
	record, err := r.Read()
	nurl := record[0]
```
```go
	}
}

func generateRequestAndGetResponse(url string, method string, username string, password string, hostHeader string, header http.Header, dur time.Duration, bodyAll []byte, num int, conc int, rateLimit float64, proxyURL *gourl.URL) {
```
This seems like it would repeat w.Init/w.Run for each CSV line; the initialization and worker startup should probably remain a single main execution (e.g. like worker.py in GKEmail).
I've put together an alternative hack in rakyll#245 that spins off a reader goroutine that feeds URLs to the workers through a shared channel.
Forked the hey repo and am trying to get it working with a CSV, so that we can pass a CSV of URLs to hit.
This is my first time working in Go, so there's definitely room for improvement. As of now, I think this will only work with CSVs, which isn't great. But before fixing that, I wanted some feedback on whether this is what we wanted.
One thing I haven't been able to make work is writing the results to a CSV. Once that's going, we'll need to change how this produces output so there's one CSV with all of the results rather than a CSV for each URL.
To test this I have been running:

```
go run hey.go -file_name /Users/noamsohn/Desktop/shorten.csv
```